L2-Nonexpansive Neural Networks

Authors

  • Haifeng Qian
  • Mark N. Wegman
Abstract

This paper proposes a class of well-conditioned neural networks in which a unit amount of change in the inputs causes at most a unit amount of change in the outputs or any of the internal layers. We develop the known methodology of controlling Lipschitz constants to realize its full potential in maximizing robustness: our linear and convolution layers subsume those in the previous Parseval networks as a special case and allow greater degrees of freedom; aggregation, pooling, splitting and other operators are adapted in new ways, and a new loss function is proposed, all for the purpose of improving robustness. With MNIST and CIFAR-10 classifiers, we demonstrate a number of advantages. Without needing any adversarial training, the proposed classifiers exceed the state of the art in robustness against white-box L2-bounded adversarial attacks. Their outputs are quantitatively more meaningful than ordinary networks and indicate levels of confidence. They are also free of exploding gradients, among other desirable properties.
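
To make the central constraint concrete: the paper requires every layer to be nonexpansive in the L2 norm, i.e., ||f(x) - f(y)||_2 <= ||x - y||_2 for any two inputs x and y. The sketch below is not the authors' construction (their linear and convolution layers are strictly more general than the orthonormality constraint of Parseval networks); it only illustrates one simple way to make a fully connected layer nonexpansive, by dividing the weight matrix by a power-iteration estimate of its spectral norm. The class name NonexpansiveLinear and all hyperparameters are illustrative assumptions, not taken from the paper.

    # Minimal sketch of an L2-nonexpansive (1-Lipschitz) linear layer.
    # NOT the paper's exact layer: it merely enforces ||W||_2 <= 1 via
    # spectral normalization, one standard way to bound the Lipschitz
    # constant of a linear map in the L2 norm.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F


    class NonexpansiveLinear(nn.Module):
        def __init__(self, in_features, out_features, n_power_iterations=3):
            super().__init__()
            self.weight = nn.Parameter(0.1 * torch.randn(out_features, in_features))
            self.bias = nn.Parameter(torch.zeros(out_features))
            self.n_power_iterations = n_power_iterations
            # Persistent estimate of the leading left singular vector of W.
            self.register_buffer("u", F.normalize(torch.randn(out_features), dim=0))

        def forward(self, x):
            w, u = self.weight, self.u
            # Power iteration: estimate the largest singular value of W.
            with torch.no_grad():
                for _ in range(self.n_power_iterations):
                    v = F.normalize(w.t() @ u, dim=0)
                    u = F.normalize(w @ v, dim=0)
                self.u.copy_(u)
            sigma = torch.dot(u, w @ v)  # spectral-norm estimate ||W||_2
            # Shrink only when needed, so that ||Wx - Wy||_2 <= ||x - y||_2.
            w_hat = w / torch.clamp(sigma, min=1.0)
            return F.linear(x, w_hat, self.bias)

A full L2-nonexpansive network would compose such layers with norm-compatible aggregation, pooling and splitting operators and the paper's proposed loss function; the sketch above covers only the per-layer Lipschitz bound, and the robustness results reported in the abstract rest on the complete construction, not on this fragment.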

Similar articles

L2-L∞ Filtering for Time-Delayed Switched Hopfield Neural Networks

This paper investigates the delay-dependent L2-L∞ filtering problem for time-delayed switched Hopfield neural networks. A new type of L2-L∞ filter is proposed such that the filtering error system is asymptotically stable with guaranteed L2-L∞ performance. The criterion is formulated in terms of linear matrix inequalities (LMIs), which can be checked readily by using certain types of standar...

New Sets of Criteria for Exponential L2-L∞ Stability of Takagi-Sugeno Fuzzy Systems Combined with Hopfield Neural Networks

In this paper, we propose new sets of criteria for exponential robust stability of Takagi-Sugeno (T-S) fuzzy Hopfield neural networks. The L2-L∞ approach is applied to obtain new sets of stability criteria, under which T-S fuzzy Hopfield neural networks reduce the effect of external input to a prescribed level. These sets of criteria are presented based on the matrix norm and linear matrix ineq...

l2-l∞ Filtering for Discrete Time-Delay Markovian Jump Neural Networks

This paper considers the l2-l∞ filter problem for discrete time-delay Markovian jump neural networks. Attention is focused on the design of a reduced-order filter to guarantee stochastic stability and a prescribed l2-l∞ performance for the filtering error system. In terms of linear matrix inequalities (LMIs), a delay-dependent sufficient condition for the solvability of the addressed problem is...

Nonasymptotic bounds on the L2 error of neural network regression estimates

The estimation of multivariate regression functions from bounded i.i.d. data is considered. The L2 error with integration with respect to the design measure is used as an error criterion. The distribution of the design is assumed to be concentrated on a finite set. Neural network estimates are defined by minimizing the empirical L2 risk over various sets of feedforward neural networks. Nonasympt...
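
For readers unfamiliar with the terminology, the empirical L2 risk and the error criterion referred to here take the following standard form; the symbols are chosen for illustration and are not taken from the cited paper:

    \hat{m}_n \;=\; \arg\min_{f \in \mathcal{F}_n} \frac{1}{n} \sum_{i=1}^{n} \bigl| f(X_i) - Y_i \bigr|^2,
    \qquad
    \text{L2 error} \;=\; \int \bigl| \hat{m}_n(x) - m(x) \bigr|^2 \, \mu(dx),

where \mathcal{F}_n is a set of feedforward neural networks, m is the true regression function and \mu is the design measure (here concentrated on a finite set).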

Exploring Lexical Network Development in Second Language Learners

This study explores how neural network models can simulate word production in second language (L2) learners. A neural network was trained to simulate L2 word production using a variety of word properties related to connectionist networks (hypernymy, polysemy, concreteness, and meaningfulness). The study demonstrates that a neural network can produce words to a similar degree as L2 learners. The...

Journal title:
  • CoRR

Volume abs/1802.07896  Issue

Pages  -

Publication date 2018